Bayesian Sparse Tucker Models for Dimension Reduction and Tensor Completion

Authors

  • Qibin Zhao
  • Liqing Zhang
  • Andrzej Cichocki
Abstract

Tucker decomposition is a cornerstone of modern machine learning for tensorial data analysis and has attracted considerable attention for multiway feature extraction, compressive sensing, and tensor completion. The most challenging problem is the determination of model complexity (i.e., the multilinear rank), especially when noise and missing data are present. In addition, existing methods cannot take into account uncertainty information of latent factors, resulting in low generalization performance. To address these issues, we present a class of probabilistic generative Tucker models for tensor decomposition and completion with structural sparsity over the multilinear latent space. To exploit structured sparse modeling, we introduce two group-sparsity-inducing priors through hierarchical representations of the Laplace and Student-t distributions, which facilitates fully Bayesian posterior inference. For model learning, we derive variational Bayesian inference over all model (hyper)parameters and develop efficient and scalable algorithms based on multilinear operations. Our methods can automatically adapt model complexity and infer an optimal multilinear rank by maximizing a lower bound on the model evidence. Experimental results and comparisons on synthetic, chemometrics, and neuroimaging data demonstrate the remarkable performance of our models in recovering the ground-truth multilinear rank and missing entries.
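The full text is not reproduced on this page, but the mechanism the abstract describes, a group-sparsity prior that drives entire factor columns (and the corresponding core slices) to zero, thereby revealing the multilinear rank, can be illustrated with a minimal NumPy sketch. The pruning rule and threshold below are hypothetical stand-ins for the paper's variational updates; only the Tucker reconstruction via mode-n products is standard.

```python
import numpy as np

def mode_n_product(T, M, n):
    """Multiply tensor T by matrix M along mode n (T x_n M)."""
    T = np.moveaxis(T, n, 0)               # bring mode n to the front
    shp = T.shape
    out = M @ T.reshape(shp[0], -1)        # matrix times mode-n unfolding
    return np.moveaxis(out.reshape(M.shape[0], *shp[1:]), 0, n)

def tucker_reconstruct(core, factors):
    """X ~ G x_1 U1 x_2 U2 ... x_N UN."""
    X = core
    for n, U in enumerate(factors):
        X = mode_n_product(X, U, n)
    return X

def prune_rank(core, factors, tol=1e-3):
    """Drop latent components whose factor columns have near-zero norm:
    the effect a group-sparsity prior has after inference (the actual
    variational pruning rule in the paper differs)."""
    for n in range(len(factors)):
        keep = np.linalg.norm(factors[n], axis=0) > tol
        factors[n] = factors[n][:, keep]
        core = np.compress(keep, core, axis=n)
    return core, factors

# Toy check: an over-parameterized model whose extra components are zero.
rng = np.random.default_rng(0)
G = np.zeros((4, 4, 4)); G[:2, :2, :2] = rng.standard_normal((2, 2, 2))
Us = [np.hstack([rng.standard_normal((d, 2)), np.zeros((d, 2))])
      for d in (5, 6, 7)]
G2, Us2 = prune_rank(G, Us)
print([U.shape for U in Us2])   # inferred multilinear rank: (2, 2, 2)
```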

Related Papers

Tensor Decompositions and Sparse Log-linear Models.

Contingency table analysis routinely relies on log-linear models, with latent structure analysis providing a common alternative. Latent structure models lead to a reduced rank tensor factorization of the probability mass function for multivariate categorical data, while log-linear models achieve dimensionality reduction through sparsity. Little is known about the relationship between these noti...
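The abstract is truncated here, but the stated link is concrete: a latent class (latent structure) model for D categorical variables induces a rank-H nonnegative CP factorization of the joint probability tensor. A small self-contained illustration (my own, not taken from the paper):

```python
import numpy as np

# Latent class model: p(y1,...,yD) = sum_h nu[h] * prod_d Lam[d][h, y_d].
# The joint pmf is then a rank-H nonnegative CP tensor factorization.
rng = np.random.default_rng(1)
H, dims = 3, (2, 3, 4)
nu = rng.dirichlet(np.ones(H))                           # class weights
Lam = [rng.dirichlet(np.ones(k), size=H) for k in dims]  # Lam[d][h] is a pmf

P = np.zeros(dims)
for h in range(H):
    P += nu[h] * np.einsum('i,j,k->ijk', *(L[h] for L in Lam))
assert np.isclose(P.sum(), 1.0)   # a valid joint pmf over all cells
```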

Near Optimal Sketching of Low-Rank Tensor Regression

We study the least squares regression problem $\min_{\Theta \in S_{D,R}} \|A\Theta - b\|_2$, where $S_{D,R}$ is the set of $\Theta$ for which $\Theta = \sum_{r=1}^{R} \theta_1^{(r)} \circ \cdots \circ \theta_D^{(r)}$ for vectors $\theta_d^{(r)} \in \mathbb{R}^{p_d}$ for all $r \in [R]$ and $d \in [D]$, and $\circ$ denotes the outer product of vectors. That is, $\Theta$ is a low-dimensional, low-rank tensor. This is motivated by the fact that the number of parameters in $\Theta$ is only $R \cdot \sum_{d=1}^{D} p_d$, which is significantly...
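To make the parameter-count argument concrete, the sketch below assembles a rank-R, D = 3 coefficient tensor from its CP factors and evaluates the least-squares objective on random data; the instance and sizes are illustrative only.

```python
import numpy as np

# Build Theta = sum_r theta_1^(r) o theta_2^(r) o theta_3^(r) and evaluate
# ||A vec(Theta) - b||_2 on a toy instance.
rng = np.random.default_rng(2)
R, dims = 2, (3, 4, 5)
thetas = [rng.standard_normal((R, p)) for p in dims]  # R * sum(p_d) = 24 params

Theta = np.zeros(dims)
for r in range(R):
    Theta += np.einsum('i,j,k->ijk', *(t[r] for t in thetas))

n = 10
A = rng.standard_normal((n, np.prod(dims)))   # a dense Theta would need 60 entries
b = rng.standard_normal(n)
loss = np.linalg.norm(A @ Theta.ravel() - b)
```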

Vectorial Dimension Reduction for Tensors Based on Bayesian Inference

Dimensionality reduction for high-order tensors is a challenging problem. In conventional approaches, higher-order tensors are "vectorized" via Tucker decomposition to obtain lower-order tensors. This either destroys the inherent high-order structure or results in undesired tensors. This paper introduces a probabilistic vectorial dimensionality reduction model for tensorial data. ...

Sparse Higher-Order Principal Components Analysis

Traditional tensor decompositions such as the CANDECOMP/PARAFAC (CP) and Tucker decompositions yield higher-order principal components that have been used to understand tensor data in areas such as neuroimaging, microscopy, chemometrics, and remote sensing. Sparsity in high-dimensional matrix factorizations and principal components has been well studied, exhibiting many benefits; less attentio...
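The abstract is cut off before the method, so the following is a generic soft-thresholded rank-1 tensor power iteration in the spirit of sparse higher-order PCA, not this paper's exact algorithm or penalty.

```python
import numpy as np

def soft(x, lam):
    """Soft-thresholding operator (proximal map of the l1 penalty)."""
    return np.sign(x) * np.maximum(np.abs(x) - lam, 0.0)

def sparse_rank1(X, lam=0.1, iters=50, seed=3):
    """Rank-1 sparse tensor PCA via alternating, soft-thresholded power
    updates -- a generic sketch, not a specific paper's algorithm."""
    rng = np.random.default_rng(seed)
    u = [rng.standard_normal(d) for d in X.shape]
    u = [v / np.linalg.norm(v) for v in u]
    for _ in range(iters):
        for n in range(X.ndim):
            v = X
            for m in range(X.ndim - 1, -1, -1):   # contract all modes but n
                if m != n:
                    v = np.tensordot(v, u[m], axes=(m, 0))
            v = soft(v, lam)
            u[n] = v / (np.linalg.norm(v) + 1e-12)
    return u   # sparse loading vectors, one per mode
```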

Regularized Tensor Factorizations and Higher-Order Principal Components Analysis

High-dimensional tensors or multi-way data are becoming prevalent in areas such as biomedical imaging, chemometrics, networking and bibliometrics. Traditional approaches to finding lower dimensional representations of tensor data include flattening the data and applying matrix factorizations such as principal components analysis (PCA) or employing tensor decompositions such as the CANDECOMP / P...
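As a point of reference for the "flattening" baseline this abstract mentions, here is what PCA on a mode-n unfolding looks like; the data and component count are arbitrary.

```python
import numpy as np

def unfold(X, n):
    """Mode-n unfolding: mode-n fibers become the columns of a matrix."""
    return np.moveaxis(X, n, 0).reshape(X.shape[n], -1)

# "Flatten, then factorize": PCA on the mode-1 unfolding of a 3-way array.
rng = np.random.default_rng(4)
X = rng.standard_normal((10, 8, 6))
M = unfold(X, 0)                          # shape (10, 48)
M = M - M.mean(axis=0, keepdims=True)     # center the flattened variables
U, s, Vt = np.linalg.svd(M, full_matrices=False)
scores = M @ Vt[:2].T                     # leading 2 principal components
```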

Journal:
  • CoRR

Volume: abs/1505.02343

Publication date: 2015